Evidence-Based Practice in Computer Access Services
Thursday, October 18, 2007 8:00 am - 9:00 am
Learn how to gather evidence to enhance your assistive technology services. Focusing on the area of access assessment, we will present the concept of evidence-based practice and describe tools that can help practitioners gather useful evidence about their clients' abilities. We will also demonstrate a particular assessment tool, Compass, as applied to sample scenarios. Compass measures performance in computer access skills, such as text entry, pointing device use, and switch use. It collects speed and accuracy data during test performance and reports the results. Come and see how to gather the evidence you need for effective access interventions.
Carmen P. DiGiovine, Rehabilitation Engineering Technologist, 6 Degrees of Freedom, LLC, Wheaton, IL.
Heidi Koester, President, Koester Performance Research, Ann Arbor, MI.







Tools and Methods for Assessment in Computer Access

Heidi Horstmann Koester and Edmund LoPresti, Koester Performance Research
The Compass Project; www.kpronline.com; hhk@kpronline.com
The purpose of this document is to briefly review current approaches in AT assessment, with a particular emphasis on computer access assessment. This review was performed in support of the Compass project, to provide a perspective on what was already available as we developed a new assessment tool.

General Assistive Technology Assessment Tools

Matching Person and Technology (members.aol.com/IMPT97/MPT.html)
The Matching Person and Technology assessment focuses on three major elements (Milieu, Person, Technology) that impact assistive technology use. The Milieu refers to characteristics of the settings in which AT is to be used. The Person refers to the user’s personal characteristics and preferences. The Technology component focuses on specific characteristics of the technology itself, including design and funding. These three areas are assessed using questionnaires directed to both the clinician and the client.
Although one of the MPT instruments addresses the needs of school-age clients, the MPT is primarily directed toward the needs of adults. A second assessment tool, the Matching Assistive Technology and CHild (MATCH) assessment, was developed specifically for children. Goals include both choosing the most appropriate technology and deciding on the most appropriate training strategies to ensure optimal use of a technology.

SETT (www2.edc.org/ncip/workshops/sett/SETT_home.html)
The SETT model (Student, Environment, Tasks, Tools) provides a guide for evaluating a person, his or her environment, and his or her desired tasks in order to identify tools to meet the person’s needs. As the name implies, the SETT is directed primarily to students. SETT is not intended as a strict protocol but as a general framework, which may incorporate other assessment tools.

Considering Assistive Technology: A Flowchart of Primary Questions (Chambers, 1997)
This tool uses a flowchart to guide professionals through consideration of assistive technology by asking a series of questions. As with SETT, this tool is primarily intended for students and, specifically, is intended to assist in the IEP process. The tool stresses that consideration is an ongoing process and that review will be needed as the student’s needs change, as the environment changes, and as more advanced technology becomes available.

Assessing Students for Assistive Technology

A number of organizations and states have developed guides for IEP teams to facilitate assistive technology assessments. These include:
  1. Quality Indicators for Assistive Technology Services (QIAT Consortium, www.qiat.org)
  2. Assistive Technology: Creating a Pathway (Iowa Dept. of Education, Bureau of Children, Family, & Community Services, http://www.aea11.k12.ia.us/att/assistive_technologygene.htm)
  3. Partnerships for AT with Indiana Schools (PATINS) Case Conference Committee Consideration for AT (CCAT) (www.patinsproject.com/)
  4. Assessing Students' Needs for Assistive Technology (Wisconsin Assistive Technology Initiative (WATI), www.wati.org)
  5. AT Screener© (Technology and Inclusion, TX, www.taicenter.com/atscreener.html)

Education Tech Points (Reed and Bowser 1998, www.edtechpoints.org)

Education Tech Points consists of a manual that includes information on team-building, components of effective AT service delivery, and systems change. Assessment forms are provided as well.

Computer Access Assessment Tools

The assessment tools described above include consideration of alternative computer input devices, either explicitly or implicitly. Given the large number of options for computer access tools, some clinicians have developed assessment tools focusing exclusively on this area. Table 1 summarizes the available tools that focus on computer access assessment.
Tool | Recommends Assessment Protocol | Automatic Test Presentation | Automatic Data Collection | Automatic Report Generation | Automatic Recommendations | Skills Evaluated
Alternative Computer Access: A Guide to Selection | Y | N | N | N | N | Use of keyboard, mouse, and alternative pointing devices; sensory abilities; cognitive abilities
Control of Computer-Based Tech. for People with Physical Disabilities | Y | N | N | N | N | Background; environment; keyboard, mouse, and switch skills
Lifespace | Y | N | N | Y | N | Physical, cognitive, emotional, support resources and environmental characteristics
Assessment of Computer Task Performance | Y | N | N | N | N | Keyboard and mouse skills
EvaluWare | N | Y | N | Y | N | Looking, listening, mouse, switch, some keyboard skills
Compass | N | Y | Y | Y | N | Keyboard, mouse, and switch skills
Custom Solutions | N | Y | Y | Y | N | Keyboard and mouse skills
Single Switch Performance Test | N | Y | Y | Y | N | Single switch activation skills
Computer Access Selector | Y | N | N | Y | Y | Finds systems that match criteria for an appropriate computer access device

Table 1. Computer Access Assessment Tools. This includes books, guides, and software programs designed to assist clinicians in assessing clients’ computer usage skills. Each is described in more detail in the text below.

Alternative Computer Access: A Guide to Selection (Anson 1997)

This book presents decision trees for performing a computer access evaluation and evaluating the impact of assistive technology. The decision tree asks a series of questions which guide the clinician through the assessment process. Questions prompt the clinician to evaluate the client’s fine motor abilities related to use of the keyboard, mouse, and alternative pointing devices; sensory abilities; and cognitive abilities. Individual chapters describe types of assistive technologies, with case studies and tables of available technologies.

Control of Computer-Based Technology for People with Physical Disabilities: An Assessment Manual. (Lee and Thomas 1990).
This manual presents computer access assessment within the context of a complete assessment protocol. The manual includes data collection instruments, procedures for client observations, testing procedures for various computer access methods, guidelines to match client needs and device characteristics, and guidelines to customize, train, implement, and monitor progress.

Lifespace Access Profile (Williams et al.1995; Patten and Stemach 1998)
Don Johnston, http://www.donjohnston.com/catalog/lifespace.htm
The Lifespace Access Profile is a team-based computer access assessment tool. The Profile evaluates a client’s current abilities across five domains:
    1. Physical resources (general health, vision, hearing, tactile sensation, postural control, coordination, mobility support, range of motion, body sites for switch access)
    2. Cognitive resources (ability to understand cause and effect; methods for discriminating between switches; communication skills)
    3. Emotional resources (reinforcers, attention span, distractibility, tolerance for change)
    4. Support resources (family members and professionals with adequate training and time to implement assistive technology)
    5. Environmental analysis (individual’s level of participation across environments, such as home, school, community, and places of work and recreation; and use of AT within those environments)
For each domain, team members rate the client from 1 to 10 based on standardized criteria for a variety of behaviors and abilities. Lifespace was originally intended for clients with severe or multiple disabilities, especially those with cognitive impairments. An "upper extension" has since been developed for clients with normal cognitive abilities.
The Profile was originally a paper-and-pencil test with a series of forms. A computer-based version is now available, with all instructions, worksheets, and forms for detailed student assessments stored on a CD. The tests are still administered manually, with the clinician performing assessments or observations to measure the client’s abilities and limitations. The computer version allows the clinician to complete the worksheets on the computer rather than on paper, and the computer can then generate a master record for each student. This facilitates updating records, tracking progress over time, and comparing performance across different interventions.

Assessment of Computer Task Performance (Dumont, Vincent, Mazer 2002; Mazer, Dumont, Vincent, 2003)
The Assessment of Computer Task Performance provides a series of standardized tests and standardized measurement criteria for assessing computer skills. It evaluates simple keyboard skills, simple pointing device skills, and combined actions (e.g., moving the cursor to a location and clicking the mouse button; drag and drop of onscreen objects). The tests are administered by the clinician, using icons present on the computer desktop and standard word processing software. The assessment tool comes with a series of transparencies which can be placed on the computer monitor. These transparencies support the tests, for example by indicating a path which the client should follow with the cursor.
The outcome of each test is measured by two criteria. All tasks have an associated success level, given on a four-point scale, with the four levels of performance defined separately for each task. Where appropriate, the clinician also measures the time required to complete the task. Reliability has been established for most of the tasks, and the test has shown good internal consistency.
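As a minimal sketch of how these two criteria could be recorded together (the field names below are hypothetical; the published instrument itself is administered and scored by hand on paper forms):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TaskOutcome:
    """One task result: a four-point success level plus an optional completion time.

    Field names are hypothetical; the ACTP is administered and scored manually.
    """
    task_name: str
    success_level: int                          # 1..4; level definitions vary by task
    completion_time_s: Optional[float] = None   # recorded only where timing applies

    def __post_init__(self):
        if not 1 <= self.success_level <= 4:
            raise ValueError("success level must be between 1 and 4")

# Example: a drag-and-drop task completed at the highest success level in 42 seconds
outcome = TaskOutcome("drag and drop", success_level=4, completion_time_s=42.0)
```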

EvaluWare (Assistive Technology, Inc. www.assistivetech.com/p-evaluware.htm)
The EvaluWare software presents a series of assessment activities for AAC and computer access evaluations. Assessed skills include Looking Skills, Listening Skills, and Motor Skills. Motor skills include skills related to use of pointing devices, touch screens, keyboards, and switches. The software also assesses "related skills", including the ability to use an onscreen keyboard, the ability to use word prediction, and the ability to answer questions based on visual scenes. The tests are designed for children and may not be age-appropriate for teenagers or adults with disabilities.
For most tasks, the clinician gives instructions to the client based on the pictures on the screen. For some tasks, instructions are given automatically (through voice output or onscreen text). In many tasks the software provides feedback (sound files, animations, etc.). For all tasks, the clinician observes the client’s performance and records the results, either in the software or on hardcopy. Recording the results in the software allows the clinician to view, print, and save reports. The software also allows the clinician to set preferences for the visual and audio presentation of tests, cues, and feedback.
Assistive Technology, Inc. also offers the Stages assessment software. Stages is a framework for defining learners’ abilities as they develop cognitive and language skills. The assessment software assists in evaluating skills appropriate to each stage, ranging from cause and effect to working with money and composing sentences. Assessment tasks commonly consist of multiple choice questions, and reports commonly indicate the number of attempts required, time required, and whether the correct answer was achieved within three attempts.

Compass (Koester Performance Research, www.kpronline.com)
Compass software provides a range of assessment activities for computer access evaluations, and may also be useful for access questions in AAC evaluations (Koester and McMillan, 1998; LoPresti, Koester, and McMillan, 2002; Ashlock, Koester, et al., 2003). The software measures skills needed for computer interaction, such as keyboard and mouse use, navigating through menus, and single-switch scanning. Eight different skill tests are currently provided, each with a wide variety of configuration options to tailor test presentation to a client’s needs. The clinician selects the appropriate tests to run with each client, depending on the particular questions to be addressed in the evaluation. Compass collects and helps clinicians interpret speed and accuracy data resulting from each skill test. This allows the clinician to observe qualitative, subjective aspects of the client’s performance while the computer does what it does well: record quantitative data. Version 1.0 of Compass software is now available for Windows. A free 30-day trial of the software is available.
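Compass’s report formats are its own, but the underlying speed and accuracy measures are standard ones. The sketch below is not Compass code; it simply illustrates, with a hypothetical function, how a text-entry trial is commonly summarized as an entry rate in words per minute (using the five-characters-per-word convention) and a character-level accuracy percentage.

```python
def text_entry_metrics(presented: str, transcribed: str, elapsed_s: float) -> dict:
    """Summarize one text-entry trial with conventional speed/accuracy measures.

    Illustrative only (not Compass's internal calculation). A common convention:
    words per minute = (characters entered / 5) / minutes elapsed.
    """
    wpm = (len(transcribed) / 5.0) / (elapsed_s / 60.0)
    # Simple position-by-position accuracy; real tools often use an
    # edit-distance-based error rate instead.
    matches = sum(1 for a, b in zip(presented, transcribed) if a == b)
    accuracy = 100.0 * matches / max(len(presented), 1)
    return {"wpm": round(wpm, 1), "accuracy_pct": round(accuracy, 1)}

# Example: a short sentence transcribed correctly in 90 seconds
sentence = "the quick brown fox jumps over the lazy dog"
print(text_entry_metrics(sentence, sentence, elapsed_s=90.0))
```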

Custom Typing. (Custom Solutions, www.customtyping.com/at_assessment.htm)
The Custom Typing web site provides an on-line typing tutor, and also measures typing speed. It can be used to assess performance differences with different text input methods. Users define the test setup by specifying parameters such as the input device, device layout, font size, and text to be transcribed. During the typing task, the software records words per minute and accuracy (percent of correctly entered words). The report also displays what the user actually typed as compared to the transcription text. A comparison report is also available to compare performance between different test setups over time. Custom Typing can be used on a 4-week trial basis.
Custom Solutions also provides some free mouse skills tests (www.customsolutions.us). These are primarily target acquisition tests, using a variety of target sizes and shapes. The software records the total time to perform all tasks in the set. That is currently the only performance information presented, but the software is still under development.

Single Switch Performance Test. (AAC Institute, www.aacinstitute.org)
SSPT software measures the average time required to activate or release the switch (following visual and/or audio prompts) and can also measure the speed of repetitive activations. The user can enter the location and type of the switch being evaluated. The software records the average response time and the range of response times (minimum and maximum). SSPT is free of charge.
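As an illustration of the kind of summary just described (this is not the SSPT’s own code), the sketch below computes the mean, minimum, and maximum latency between each prompt and the corresponding switch activation.

```python
def switch_response_summary(prompt_times, activation_times):
    """Summarize switch response latencies from paired prompt/activation timestamps.

    Times are in seconds; activation i is assumed to answer prompt i.
    Illustrative only, not the SSPT's internal logic.
    """
    latencies = [a - p for p, a in zip(prompt_times, activation_times)]
    return {
        "mean_s": sum(latencies) / len(latencies),
        "min_s": min(latencies),
        "max_s": max(latencies),
    }

# Example: three prompted activations -> mean ~1.2 s, min ~0.9 s, max 1.5 s
print(switch_response_summary([0.0, 5.0, 10.0], [1.2, 5.9, 11.5]))
```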

Advisor Systems

Computer-based assessment tools such as those described above are intended to assist a clinician in collecting and reporting data. The clinician can then use these data to make decisions about assistive technology prescription and document those decisions in funding requests or IEPs. Some systems go a step further and automatically analyze performance data to make recommendations to the clinician. Several systems of this type have been researched over the years, but none appears to be commercially available as of June 2006.

The Computer Access Selector (Stapleton et al. 1997) is a computer-based prescription tool for computer access products. The software prompts a clinician with a series of possible device criteria, and the clinician selects the criteria appropriate for his or her client from the list of options. Based on the clinician’s input, the software searches a list of known devices for those which match the criteria, then reports the devices which match the client’s needs, with details about the prescribed devices. The Selector was previously available from Regency Park Rehabilitation Engineering (Regency Park, Australia, http://regencyrehab.cca.org.au/).
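The Selector’s actual device database and matching rules are not described here; the sketch below (device names and criteria are invented for the example) only illustrates the general approach of filtering a device list against clinician-selected criteria.

```python
# Hypothetical device records and criteria; not the Computer Access Selector's data.
devices = [
    {"name": "Device A", "criteria": {"single switch", "auditory scanning"}},
    {"name": "Device B", "criteria": {"expanded keyboard", "keyguard"}},
    {"name": "Device C", "criteria": {"single switch", "visual scanning"}},
]

def matching_devices(selected_criteria, catalog):
    """Return names of devices whose feature set satisfies every selected criterion."""
    wanted = set(selected_criteria)
    return [d["name"] for d in catalog if wanted <= d["criteria"]]

print(matching_devices({"single switch"}, devices))  # ['Device A', 'Device C']
```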

Performance Monitoring and Outcome Measures

Although the tests listed above incorporate the computer into the assessment process, none of the currently commercially available assessment tools use automatic data logging. A tool which automatically records quantitative data can facilitate the production of automated reports, and would also free the clinician to observe qualitative, subjective aspects of the client’s performance while the computer does what it does well: record quantitative data. At least two methods have been proposed to monitor human performance in the related area of augmentative and alternative communication outcomes assessment.

Language Activity Monitor (AAC Institute, www.aacinstitute.org)
The Language Activity Monitor (LAM) records data related to a person’s use of an augmentative and alternative communication (AAC) device (Hill and Romich 1999; Romich and Hill 1999). The LAM is a hardware monitor that plugs into the serial port of a dedicated AAC device or general-purpose computer. The LAM reads the characters, words, or phrases selected by the user on the AAC device. This device output is recorded to an ASCII text file with a time stamp. A sample output might look like:
11:20:50 "Hello"
11:21:35 "there"
with a timestamp followed by the AAC output, and each output presented on a separate line. The LAM is available as a built-in component of current Prentke Romich Company AAC devices. It is also available as software that runs on a computer and collects language output from a computer-based AAC system. Software is also available to upload data from a dedicated AAC device (such as the Prentke Romich products with built-in LAM) to a computer for analysis.
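Because each LAM entry is just a timestamp followed by the quoted device output, the log is easy to post-process. The sketch below is not the AAC Institute’s analysis software; it simply parses lines like those shown above and estimates an output rate in words per minute.

```python
import re
from datetime import datetime

LAM_LINE = re.compile(r'^(\d{2}:\d{2}:\d{2})\s+"(.*)"$')

def words_per_minute(log_lines):
    """Parse LAM-style 'HH:MM:SS "output"' lines and estimate the output rate.

    Illustrative post-processing only; not the AAC Institute's tools.
    """
    events = []
    for line in log_lines:
        m = LAM_LINE.match(line.strip())
        if m:
            timestamp = datetime.strptime(m.group(1), "%H:%M:%S")
            events.append((timestamp, m.group(2)))
    if len(events) < 2:
        return 0.0
    words = sum(len(text.split()) for _, text in events)
    minutes = (events[-1][0] - events[0][0]).total_seconds() / 60.0
    return words / minutes if minutes > 0 else 0.0

# The two-line example above yields 2 words over 45 seconds, about 2.7 words/minute
print(words_per_minute(['11:20:50 "Hello"', '11:21:35 "there"']))
```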

ACQUA. (Enkidu Labs / DynaVox Technologies, www.dynavoxsys.com/Default.aspx?tabid=361)
Another data logging format has been proposed which provides more detailed information (Higginbotham et al. 1999). The format is similar to (and compatible with) the format used by the LAM. It also provides the option to record information such as: the user activity that prompted a logfile entry (e.g. mouse clicks, keypresses, switch closures, elapsed dwell time); the input device used for the action (e.g. mouse or keyboard); and the active page, folder, or screen. A number of other information types can be recorded.
This data format is implemented as an optional data logging feature in Enkidu’s Impact AAC software. The data log format is freely available for use by other manufacturers. ACQUA is a separate software application that reads a log file of the format used by Impact and calculates a number of statistics related to language activity. These include characters or words per unit time, or the number of selections per character. ACQUA can also translate the log data into a tab-delimited file readable by software such as Excel or SPSS.
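As a rough sketch of what translating the log into a tab-delimited file involves (the field names here are invented and are not the published logfile standard), richer log entries can be flattened into one row per event for analysis in Excel or SPSS.

```python
import csv

# Hypothetical parsed log entries; the actual logfile standard defines its own fields.
entries = [
    {"time": "11:20:50", "action": "keypress", "device": "keyboard", "output": "Hello"},
    {"time": "11:21:35", "action": "switch closure", "device": "switch", "output": "there"},
]

with open("log_export.tsv", "w", newline="") as f:
    writer = csv.DictWriter(
        f, fieldnames=["time", "action", "device", "output"], delimiter="\t"
    )
    writer.writeheader()
    writer.writerows(entries)  # one tab-delimited row per logged event
```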

References
Anson, D. (1997). Alternative Computer Access: A Guide to Selection. Philadelphia: F.A. Davis Company.
Ashlock, G., Koester, H., LoPresti, E., McMillan, W., Simpson, R. (2003). User-centered Design of Software for Assessing Computer Usage Skills. 26th Annual Conference on Rehabilitation Engineering (RESNA), Atlanta, GA, June 2003.
Chambers, A.C. (1997). CASE/TAM Assistive Technology Policy and Practice Series: Has Technology Been Considered? A Guide for IEP Teams. Reston, VA: Council of Administrators of Special Education and the Technology and Media Division of the Council for Exceptional Children.
Dumont, C., Dionne, C. (2000). Validation d’un instrument de mesure pour evaluer l’acces a l’ordinateur chez les personnes ayant une deficience physique [Validation of a measurement instrument to evaluate computer access for people with a physical disability]. Canadian Journal of Occupational Therapy, 67(3):173-183.
Dumont, C., Vincent, C., Mazer, B. (2002). Development of a Standardized Instrument to Assess Computer Task Performance. American Journal of Occupational Therapy, 56(1):60-68.
Higginbotham, D.J., Lesher, G.W., Moulton, B.J. (1999). Development of a Voluntary Standard Format for Augmentative Communication Device Logfiles. Proceedings of the RESNA '99 Annual Conference, 25-27, Arlington, VA: RESNA Press.
Hill, K.J. & Romich, B.A. (1999). A proposed standard for AAC and writing system data logging for clinical intervention, outcomes measurement, and research. Proceedings of the RESNA '99 Annual Conference, 22-24, Arlington, VA: RESNA Press.
Koester, H.H. and McMillan, W.W. (1997). Software for Assessing Computer Usage Skills. 20th Annual Conference on Rehabilitation Engineering (RESNA), Pittsburgh, PA.
Lee, K.S., Thomas, D.J. (1990). Control of Computer-Based Technology for People with Physical Disabilities: An Assessment Manual. Toronto: University of Toronto Press.
LoPresti, E., Koester, H., and McMillan, W. (2002). Tools for Assessing Computer Access Skills. Proceedings of ASSETS 2002, New York: ACM.
Patten, F.M., Stemach, G. (1998). Lifespace Access Profile©: Assistive Technology Assessment and Planning for Individuals with Severe or Multiple Disabilities (Original Protocol) and the Upper Extension for Individuals with Physical Disabilities. Now offered in both Macintosh and Windows '95 versions. Proc. CSUN 1998.
Reed, P., Bowser, G. (1998). Education Tech Points: A framework for assistive technology planning and systems change in schools. Proc. CSUN 1998, Los Angeles, CA.
Romich, B.A. & Hill, K.J. (1999). A language activity monitor for AAC and writing systems: Clinical intervention, outcomes measurement, and research. Proceedings of the RESNA '99 Annual Conference, 19-21, Arlington, VA: RESNA Press.
Stapleton, D., Garrett, R., Seeger, B.R. (1997). VOCAselect and Computer Access Selector: Software Tools to Assist in Choosing Assistive Technology. Proceedings of the RESNA 1997 Annual Conference, Pittsburgh, PA, RESNA Press, 351-353.
Williams, W.B., Stemach, G., Wolf, S., Stanger, C. (1995). Lifespace Access Profile: Assistive Technology Assessment and Planning for Individuals with Severe or Multiple Disabilities. Irvine, CA: Lifespace Access Assistive Technology Systems.
